AttnRes replaces the Transformer's fixed residual combination with softmax attention along the depth direction. A demonstration on Kimi Linear 48B improved GPQA-Diamond by +7.5 points and HumanEval by +3.1 points, while keeping training overhead below 4% and inference overhead below 2%.
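The core idea can be sketched as follows. This is a minimal NumPy illustration, not the actual AttnRes implementation: instead of the fixed residual sum x + f(x), the current sublayer output attends over all earlier hidden states along the depth axis and returns a softmax-weighted combination. The projection matrices `wq`, `wk`, the per-token depth attention, and all shapes here are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def depth_attn_residual(sublayer_out, history, wq, wk):
    """Combine the current sublayer output with all earlier hidden states
    via softmax attention over depth, replacing the fixed x + f(x) sum.
    (Hypothetical sketch; not the published AttnRes formulation.)"""
    # states: (depth, seq, d) -- earlier hidden states plus the current one
    states = np.stack(history + [sublayer_out], axis=0)
    q = sublayer_out @ wq                        # query from current output: (seq, d)
    k = states @ wk                              # keys from each depth:      (depth, seq, d)
    # per-token attention scores over the depth axis: (seq, depth)
    scores = np.einsum('sd,lsd->sl', q, k) / np.sqrt(q.shape[-1])
    w = softmax(scores, axis=-1)                 # depth weights sum to 1 per token
    # weighted combination over depth: (seq, d)
    return np.einsum('sl,lsd->sd', w, states)

rng = np.random.default_rng(0)
d = 8
wq = rng.normal(size=(d, d))
wk = rng.normal(size=(d, d))
x0 = rng.normal(size=(4, d))   # earlier layer's hidden state (seq=4)
x1 = rng.normal(size=(4, d))   # current sublayer output
y = depth_attn_residual(x1, [x0], wq, wk)
print(y.shape)  # (4, 8)
```

Because the depth weights are produced by a softmax rather than fixed to 1 for each branch, the network can learn, per token, how strongly to draw on each earlier layer; the extra cost is one small attention over the depth axis, consistent with the low overhead reported above.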